Nesterov acceleration of alternating least squares for canonical tensor decomposition: Momentum step size selection and restart mechanisms
Authors
Abstract
Related papers
Local Convergence of the Alternating Least Squares Algorithm for Canonical Tensor Approximation
A local convergence theorem for calculating canonical low-rank tensor approximations (PARAFAC, CANDECOMP) by the alternating least squares algorithm is established. The main assumption is that the Hessian matrix of the problem is positive definite modulo the scaling indeterminacy. A discussion of whether this assumption is realistic, together with numerical illustrations, is included, and regularization is also addressed.
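For readers unfamiliar with the algorithm this convergence result concerns, the following is a minimal NumPy sketch of plain ALS for a rank-R CP (CANDECOMP/PARAFAC) approximation of a 3-way array; the random initialization, fixed iteration count, and einsum-based MTTKRP are illustrative choices, not taken from the paper above.

```python
import numpy as np

def cp_als(X, R, n_iter=100, seed=0):
    """Plain ALS for a rank-R CP approximation of a 3-way array X (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    for _ in range(n_iter):
        # Each factor update solves a linear least-squares problem via its
        # normal equations, e.g. A <- MTTKRP(X; B, C) [(B'B) * (C'C)]^{-1}.
        A = np.linalg.solve((B.T @ B) * (C.T @ C),
                            np.einsum('ijk,jr,kr->ri', X, B, C)).T
        B = np.linalg.solve((A.T @ A) * (C.T @ C),
                            np.einsum('ijk,ir,kr->rj', X, A, C)).T
        C = np.linalg.solve((A.T @ A) * (B.T @ B),
                            np.einsum('ijk,ir,jr->rk', X, A, B)).T
    return A, B, C
```

For example, `A, B, C = cp_als(np.random.rand(20, 30, 40), R=5)` fits a rank-5 model to a random dense array.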
Randomized Alternating Least Squares for Canonical Tensor Decompositions: Application to A PDE With Random Data
This paper introduces a randomized variation of the alternating least squares (ALS) algorithm for rank reduction of canonical tensor formats. The aim is to address the potential numerical ill-conditioning of least squares matrices at each ALS iteration. The proposed algorithm, dubbed randomized ALS, mitigates large condition numbers via projections onto random tensors, a technique inspired by w...
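The excerpt above only names the random-projection idea. As a hedged illustration of the generic mechanism (not necessarily the specific random-tensor construction used in that paper), one ALS least-squares subproblem min_W ||Z W − Y||_F, with Z the Khatri-Rao design matrix of the current sweep and Y the matching unfolding, can be compressed and solved in sketch-and-solve fashion:

```python
import numpy as np

def sketched_lstsq(Z, Y, m, seed=0):
    """Solve min_W ||Z W - Y||_F approximately after compressing both sides with
    an m-row Gaussian sketch (sketch-and-solve).  This only illustrates the
    generic random-projection mechanism; the conditioning analysis in the cited
    paper relies on its own random-tensor construction."""
    rng = np.random.default_rng(seed)
    S = rng.standard_normal((m, Z.shape[0])) / np.sqrt(m)   # random projection
    W, *_ = np.linalg.lstsq(S @ Z, S @ Y, rcond=None)
    return W
```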
Some Convergence Results on the Regularized Alternating Least-Squares Method for Tensor Decomposition
We study the convergence of the Regularized Alternating Least-Squares algorithm for tensor decompositions. As a main result, we have shown that given the existence of critical points of the Alternating Least-Squares method, the limit points of the converging subsequences of the RALS are the critical points of the least squares cost functional. Some numerical examples indicate a faster convergen...
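As one concrete (assumed) form of the regularized update discussed above, a common RALS-style variant adds a proximal term λ‖A − A_old‖² to each least-squares subproblem; a sketch of a single factor update under that assumption:

```python
import numpy as np

def rals_update_A(X, A, B, C, lam):
    """One proximally regularized ALS update of factor A for a 3-way tensor X:
    minimize ||X_(1) - A (C kr B)^T||^2 + lam * ||A - A_old||^2.  Its normal
    equations shift the Gram matrix by lam*I, keeping the subproblem well posed
    even when (B'B) * (C'C) is nearly singular."""
    R = A.shape[1]
    G = (B.T @ B) * (C.T @ C) + lam * np.eye(R)      # regularized Gram matrix
    M = np.einsum('ijk,jr,kr->ir', X, B, C)          # MTTKRP for mode 1
    return np.linalg.solve(G, (M + lam * A).T).T     # (M + lam*A_old) G^{-1}
```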
Tensor Decompositions, Alternating Least Squares and other Tales
This work was originally motivated by a classification of tensors proposed by Richard Harshman. In particular, we focus on simple and multiple “bottlenecks”, and on “swamps”. Existing theoretical results are surveyed, some numerical algorithms are described in detail, and their numerical complexity is calculated. In particular, the interest in using the ELS enhancement in these algorithms is d...
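The ELS (enhanced line search) technique mentioned above chooses an extrapolation step length after each ALS sweep by minimizing the fit as a function of that step; the sketch below replaces the exact polynomial minimization with a crude grid search purely for brevity, and the helper names and grid are assumptions:

```python
import numpy as np

def els_step(X, old, new, mus=np.linspace(1.0, 10.0, 50)):
    """Crude stand-in for ELS: an ALS sweep has mapped the factor matrices
    `old` -> `new`; extrapolate along that direction and keep the step length
    mu that most reduces the CP residual (mu = 1 recovers the plain ALS sweep)."""
    def residual(factors):
        A, B, C = factors
        return np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C))
    best_mu = min(mus, key=lambda mu: residual([o + mu * (n - o)
                                                for o, n in zip(old, new)]))
    return [o + best_mu * (n - o) for o, n in zip(old, new)]
```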
DMS: Distributed Sparse Tensor Factorization with Alternating Least Squares
Tensors are data structures indexed along three or more dimensions. Tensors have found increasing use in domains such as data mining and recommender systems, where the dimensions can have enormous length and the resulting data are very sparse. The canonical polyadic decomposition (CPD) is a popular tensor factorization for discovering latent features and is most commonly found via the method of alternating...
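The dominant kernel in each sparse CPD-ALS iteration described above is the matricized-tensor times Khatri-Rao product (MTTKRP) evaluated only at the nonzeros; a single-node NumPy sketch for a COO-stored 3-way tensor follows (the distributed decomposition used by DMS itself is not reproduced here):

```python
import numpy as np

def sparse_mttkrp_mode0(coords, vals, B, C, I):
    """MTTKRP for mode 0 of a sparse 3-way tensor in COO form:
    M[i, :] += vals[n] * (B[j, :] * C[k, :]) for every nonzero (i, j, k)."""
    i, j, k = coords                      # 1-D integer index arrays, one per mode
    M = np.zeros((I, B.shape[1]))
    np.add.at(M, i, vals[:, None] * B[j] * C[k])   # unbuffered scatter-add over rows
    return M
```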
Journal
Journal title: Numerical Linear Algebra with Applications
Year: 2020
ISSN: 1070-5325, 1099-1506
DOI: 10.1002/nla.2297